Apple is introducing a new iMessage feature aimed at improving digital safety for children. Initially available in beta only in Australia, the update will let children report inappropriate content directly to Apple.
iMessage Will Be a Much Safer Platform
With iOS 17, Apple is taking its communication safety measures for users under 13 a step further. The new system allows iPhones to automatically detect images and videos containing nudity and intervene when necessary.
This safety system works across the iMessage, AirDrop, FaceTime, and Photos apps. To protect user privacy, detection is performed entirely on-device. When inappropriate content is identified, young users see a two-step warning screen and are offered the option to contact a parent or guardian before viewing the content.
Apple is testing the feature in Australia first, with plans to roll it out globally later. The timing follows Australia's recently passed laws requiring tech companies to police child abuse and terrorism-related content by the end of 2024, and the move has already drawn positive feedback in the country.
What do you think about this? Should a similar feature be implemented in our country? Don't forget to share your thoughts.